Dyad

Dyad is a free, local, open-source AI app builder

Cherry Studio

A desktop client that supports multiple LLM providers

Witsy

Desktop AI Assistant / Universal MCP Client

AI as Workspace

A better AI (LLM) client. Full-featured yet lightweight. Supports multiple workspaces, a plugin system, cross-platform use, local-first storage with real-time cloud sync, Artifacts, and MCP

ollamate

Ollama chat client powered by Tauri & Next.js

GPTranslate

Fast, modern desktop translation application

Dive

An open-source MCP host desktop application that seamlessly integrates with any LLM that supports function calling.

Algernon

Small self-contained pure-Go web server with Lua, Markdown, HTTP/2, QUIC, Redis and PostgreSQL support

Tome

A desktop LLM client with magical MCP support

Gonzo

Gonzo! The Go-based TUI log-analysis tool

Hollama

A minimal LLM chat app that runs entirely in your browser

AingDesk

Run AI models on your desktop with one click.

Nekot

A portable terminal AI interface

LobeHub-Beta

An open-source, modern-design AI chat framework. Supports multiple AI providers (OpenAI / Claude 4 / Gemini / Ollama / DeepSeek / Qwen), a knowledge base (file upload, knowledge management, RAG), multi-modal features (Plugins/Artifacts), and Thinking.

Khoj

Your AI second brain. Self-hostable. Get answers from the web or your docs. Build custom agents, schedule automations, do deep research. Turn any online or local LLM into your personal, autonomous AI (gpt, claude, gemini, llama, qwen, mistral).

Kun Avatar

An Agent application deployed locally on the Ollama inference framework, providing MCP tool calling, short- and long-term memory, and other features.

Ollama Operator

Yet another operator for running large language models on Kubernetes with ease. Powered by Ollama! 🐫

ollama-desktop

Ollama Desktop is a GUI tool for running and managing Ollama models.

Chatless

Open-source, lightweight, and modern. A local AI chat client that supports multiple providers and local models.

5ire

A Sleek AI Assistant & MCP Client

Harbor

Effortlessly run LLM backends, APIs, frontends, and services with one command.
